Conditional mutual information

In probability theory, and in particular, information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.
==Definition==
For discrete random variables X, Y, and Z, we define
:I(X;Y|Z) = \mathbb E_Z \big(I(X;Y)|Z\big)
= \sum_{z\in Z} p_Z(z) \sum_{y\in Y} \sum_{x\in X} p_{X,Y|Z}(x,y|z) \log \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\,p_{Y|Z}(y|z)},
where the marginal, joint, and/or conditional probability mass functions are denoted by p with the appropriate subscript. This can be simplified as
:I(X;Y|Z) = \sum_{z\in Z} \sum_{y\in Y} \sum_{x\in X} p_{X,Y,Z}(x,y,z) \log \frac{p_Z(z)\,p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\,p_{Y,Z}(y,z)}.
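As a minimal numerical sketch of this simplified formula (an illustration only; the function name and the toy distribution below are assumptions, not part of the article), the triple sum can be evaluated directly from a joint probability mass function:
<syntaxhighlight lang="python">
from collections import defaultdict
from math import log2

def conditional_mutual_information(pxyz):
    """Evaluate I(X;Y|Z) in bits from a joint pmf.

    pxyz maps (x, y, z) tuples to probabilities that sum to 1.
    Uses the simplified formula
    I(X;Y|Z) = sum_{x,y,z} p(x,y,z) * log[ p(z) * p(x,y,z) / (p(x,z) * p(y,z)) ].
    """
    pz = defaultdict(float)
    pxz = defaultdict(float)
    pyz = defaultdict(float)
    # Accumulate the marginals that appear in the formula.
    for (x, y, z), p in pxyz.items():
        pz[z] += p
        pxz[(x, z)] += p
        pyz[(y, z)] += p
    # Triple sum over the support; terms with p(x,y,z) = 0 contribute nothing.
    return sum(p * log2(pz[z] * p / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), p in pxyz.items() if p > 0)

# Toy check: X, Y, Z independent fair bits, so I(X;Y|Z) should be 0 bits.
uniform = {(x, y, z): 1 / 8 for x in (0, 1) for y in (0, 1) for z in (0, 1)}
print(conditional_mutual_information(uniform))  # 0.0
</syntaxhighlight>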
Alternatively, we may write (K. Makarychev et al., ''A new class of non-Shannon-type inequalities for entropies'', Communications in Information and Systems, Vol. 2, No. 2, pp. 147–166, December 2002) in terms of joint and conditional entropies as
:I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
= H(X|Z) - H(X|Y,Z).
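The equality of the last two expressions follows from the chain rule for entropy, H(X|Z) = H(X,Z) - H(Z) and H(X|Y,Z) = H(X,Y,Z) - H(Y,Z), since
:H(X|Z) - H(X|Y,Z) = \big(H(X,Z) - H(Z)\big) - \big(H(X,Y,Z) - H(Y,Z)\big) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z).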
The conditional mutual information can be rewritten to show its relationship to mutual information
:I(X;Y|Z) = I(X;Y,Z) - I(X;Z)
usually rearranged as the chain rule for mutual information
:I(X;Y,Z) = I(X;Z) + I(X;Y|Z)
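The chain rule itself follows from the entropy forms above: since I(X;Y,Z) = H(X) - H(X|Y,Z) and I(X;Z) = H(X) - H(X|Z),
:I(X;Y,Z) - I(X;Z) = H(X|Z) - H(X|Y,Z) = I(X;Y|Z).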
Another equivalent form of the conditional mutual information is
:I(X;Y|Z) = H(Z|X) + H(X) + H(Z|Y) + H(Y) - H(Z|X,Y) - H(X,Y) - H(Z)
= I(X;Y) + H(Z|X) + H(Z|Y) - H(Z|X,Y) - H(Z)
Conditioning on a third random variable may either increase or decrease the mutual information: that is, the difference I(X;Y|Z) - I(X;Y), called the interaction information, may be positive, negative, or zero, but it is always true that
:I(X;Y|Z) \ge 0
for discrete, jointly distributed random variables ''X'', ''Y'', ''Z''. This result has been used as a basic building block for proving other inequalities in information theory, in particular, those known as Shannon-type inequalities.
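For example, if X and Y are independent fair bits and Z = X \oplus Y is their exclusive OR, then I(X;Y) = 0 while
:I(X;Y|Z) = H(X|Z) - H(X|Y,Z) = 1 - 0 = 1 \text{ bit},
since Z alone reveals nothing about X, whereas Y and Z together determine X; here conditioning increases the mutual information and the interaction information is positive.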
Like mutual information, conditional mutual information can be expressed as a Kullback–Leibler divergence:
:I(X;Y|Z) = D_{\mathrm{KL}}\big( p(X,Y,Z) \,\|\, p(X|Z)\,p(Y|Z)\,p(Z) \big),
or as an expected value of simpler Kullback–Leibler divergences:
:I(X;Y|Z) = \sum_{z} p(Z=z)\, D_{\mathrm{KL}}\big( p(X,Y|z) \,\|\, p(X|z)\,p(Y|z) \big)
= \sum_{y} p(Y=y)\, D_{\mathrm{KL}}\big( p(X,Z|y) \,\|\, p(X|Z)\,p(Z|y) \big).
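The first divergence agrees with the simplified sum above, because
:p(x|z)\,p(y|z)\,p(z) = \frac{p_{X,Z}(x,z)\,p_{Y,Z}(y,z)}{p_Z(z)},
so the log-ratio inside the divergence equals \log \frac{p_Z(z)\,p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\,p_{Y,Z}(y,z)}.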

Source: Wikipedia, the free encyclopedia. Read the full article on "Conditional mutual information" at Wikipedia.


